Practical Bayesian inference using mixtures of mixtures.

Authors

  • G Cao
  • M West
Abstract

Discrete mixtures of normal distributions are widely used in modeling amplitude fluctuations of electrical potentials at synapses of human and other animal nervous systems. The usual framework has independent data values y_j arising as y_j = mu_j + x_{n0+j}, where the means mu_j come from some discrete prior G(mu) and where the unknown x_{n0+j}'s, together with the observed x_j, j = 1, ..., n0, are Gaussian noise terms. A practically important development of the associated statistical methods is the issue of nonnormality of the noise terms, often the norm rather than the exception in the neurological context. We have recently developed models, based on convolutions of Dirichlet process mixtures, for such problems. Explicitly, we model the noise data values x_j as arising from a Dirichlet process mixture of normals, in addition to modeling the location prior G(mu) as a Dirichlet process itself. This induces a Dirichlet mixture of mixtures of normals, whose analysis may be developed using Gibbs sampling techniques. We discuss these models and their analysis, and illustrate them in the context of neurological response analysis.
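
As a concrete illustration of the generative structure just described, the following Python sketch simulates data from a model of this form using truncated stick-breaking draws for the two Dirichlet processes. The truncation levels, concentration parameters, base measures and sample sizes are illustrative assumptions rather than settings from the paper, and the sketch covers only the generative side of the model, not the Gibbs sampling analysis.

  import numpy as np

  rng = np.random.default_rng(0)

  def stick_breaking_weights(alpha, k, rng):
      # Truncated stick-breaking construction of Dirichlet process weights.
      betas = rng.beta(1.0, alpha, size=k)
      remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas[:-1])))
      w = betas * remaining
      return w / w.sum()  # renormalise after truncation

  # Noise model: Dirichlet process mixture of normals for the x's.
  k_noise = 20                                      # truncation level (assumption)
  w_noise = stick_breaking_weights(1.0, k_noise, rng)
  noise_means = rng.normal(0.0, 0.5, size=k_noise)  # component means from a base measure (assumption)
  noise_sds = np.full(k_noise, 0.3)                 # common component scale (assumption)

  def draw_noise(n):
      comps = rng.choice(k_noise, size=n, p=w_noise)
      return rng.normal(noise_means[comps], noise_sds[comps])

  # Location prior G(mu): itself a discrete Dirichlet process realisation.
  k_loc = 20
  w_loc = stick_breaking_weights(1.0, k_loc, rng)
  loc_atoms = rng.normal(5.0, 2.0, size=k_loc)      # atoms of G from a base measure (assumption)

  n0, n = 100, 200                  # noise and response sample sizes (assumption)
  x_obs = draw_noise(n0)            # x_j, j = 1, ..., n0: observed noise records
  mu = loc_atoms[rng.choice(k_loc, size=n, p=w_loc)]  # mu_j drawn from G
  y = mu + draw_noise(n)            # y_j = mu_j + x_{n0+j}: observed responses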

Similar resources

Bayesian Density Estimation and Inference Using Mixtures

We describe and illustrate Bayesian inference in models for density estimation using mixtures of Dirichlet processes. These models provide natural settings for density estimation, and are exemplified by special cases where data are modelled as a sample from mixtures of normal distributions. Efficient simulation methods are used to approximate various prior, posterior and predictive distributions. ...

Practical Aspects of Solving Hybrid Bayesian Networks Containing Deterministic Conditionals

In this paper we discuss some practical issues that arise in solving hybrid Bayesian networks that include deterministic conditionals for continuous variables. We show how exact inference can become intractable even for small networks, due to the difficulty in handling deterministic conditionals (for continuous variables). We propose some strategies for carrying out the inference task using mix...

Variational inference for Dirichlet process mixtures

Dirichlet process (DP) mixture models are the cornerstone of nonparametric Bayesian statistics, and the development of Monte-Carlo Markov chain (MCMC) sampling methods for DP mixtures has enabled the application of nonparametric Bayesian methods to a variety of practical data analysis problems. However, MCMC sampling can be prohibitively slow, and it is important to explore alternatives. One cl...

Exact Inference on Conditional Linear Γ-Gaussian Bayesian Networks

Exact inference for Bayesian Networks is only possible for quite limited classes of networks. Examples of such classes are discrete networks, conditional linear Gaussian networks, networks using mixtures of truncated exponentials, and networks with densities expressed as truncated polynomials. This paper defines another class with exact inference, based on the normal inverse gamma conjugacy. We...

Bayesian Inference on Mixtures of Distributions

This survey covers state-of-the-art Bayesian techniques for the estimation of mixtures. It complements the earlier Marin et al. (2005) by studying new types of distributions, the multinomial, latent class and t distributions. It also exhibits closed form solutions for Bayesian inference in some discrete setups. Finally, it sheds new light on the computation of Bayes factors via the approximat...

Journal:
  • Biometrics

Volume 52, Issue 4

Pages: -

Publication year: 1996